LogDet Rank Minimization with Application to Subspace Clustering

Authors

  • Zhao Kang
  • Chong Peng
  • Jie Cheng
  • Qiang Cheng
Abstract

Low-rank matrices are desired in many machine learning and computer vision problems. Most recent studies use the nuclear norm as a convex surrogate of the rank operator. However, the nuclear norm simply adds all singular values together, so the rank may not be well approximated in practical problems. In this paper, we propose using a log-determinant (LogDet) function as a smooth and closer, though nonconvex, approximation to rank for obtaining a low-rank representation in subspace clustering. An augmented Lagrange multiplier strategy is applied to iteratively optimize the LogDet-based nonconvex objective function on potentially large-scale data. By making use of the angular information of the principal directions of the resultant low-rank representation, an affinity graph matrix is constructed for spectral clustering. Experimental results on motion segmentation and face clustering data demonstrate that the proposed method often outperforms state-of-the-art subspace clustering algorithms.
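As a rough illustration of why a LogDet penalty can track rank more closely than the nuclear norm, the sketch below compares the two on a synthetic low-rank matrix. It is a minimal NumPy example assuming the common surrogate form log det(I + ZᵀZ) = Σᵢ log(1 + σᵢ(Z)²); the paper's exact formulation, its augmented-Lagrange solver, and the angular affinity construction are not reproduced here.

```python
import numpy as np

def logdet_rank_surrogate(Z):
    """LogDet surrogate of rank: log det(I + Z^T Z) = sum_i log(1 + sigma_i(Z)^2).

    Unlike the nuclear norm (a plain sum of singular values), the log saturates
    for large singular values, so a few dominant directions do not inflate the
    penalty far beyond the scale of the true rank.
    """
    sigma = np.linalg.svd(Z, compute_uv=False)
    return float(np.sum(np.log1p(sigma ** 2)))

def nuclear_norm(Z):
    return float(np.sum(np.linalg.svd(Z, compute_uv=False)))

# Toy comparison: a 50 x 50 matrix with 3 large singular values plus tiny noise.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((50, 50)))
V, _ = np.linalg.qr(rng.standard_normal((50, 50)))
sigma = np.concatenate([[10.0, 8.0, 5.0], 1e-3 * rng.random(47)])
Z = U @ np.diag(sigma) @ V.T

print("numerical rank   :", np.linalg.matrix_rank(Z, tol=1e-2))
print("nuclear norm     :", nuclear_norm(Z))            # grows with the magnitude of the sigmas
print("LogDet surrogate :", logdet_rank_surrogate(Z))   # saturates per significant singular value
```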


Related articles

Subspace clustering based on low rank representation and weighted nuclear norm minimization

Subspace clustering refers to the problem of segmenting a set of data points approximately drawn from a union of multiple linear subspaces. Various algorithms have been proposed for this problem, and low-rank-representation-based subspace clustering is a very promising and efficient one. The low-rank representation method seeks ...


A Review on Fixed-Rank Representation for Supervised Learning Using Neural Network

This study focused on the development and application of an efficient algorithm for clustering and classification of supervised visual data. In machine learning, classification and clustering are among the most useful techniques in pattern recognition and computer vision. Existing techniques for subspace clustering make use of rank minimization and sparsity-based methods that are computationally expensive and ma...


Multi-View Subspace Clustering via Relaxed L1-Norm of Tensor Multi-Rank

In this paper, we address the multi-view subspace clustering problem. Our method utilizes the circulant algebra for tensors, constructed by stacking the subspace representation matrices of different views and then shifting, to explore the high-order correlations underlying multi-view data. By introducing a recently proposed tensor factorization, namely the tensor-Singular Value Decomposition...
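For readers unfamiliar with the tensor multi-rank that such relaxed penalties act on, here is a minimal sketch under the standard t-SVD convention: transform the stacked representation tensor with an FFT along the third mode and record the rank of each frontal slice in the Fourier domain. This only illustrates the quantity itself; the circulant shifting and the factorization used in that paper are not reproduced.

```python
import numpy as np

def tensor_multi_rank(T, tol=1e-8):
    """Multi-rank of an n1 x n2 x n3 tensor in the t-SVD framework.

    FFT along the third mode, then the multi-rank is the vector of ranks of
    the frontal slices in the Fourier domain.
    """
    T_hat = np.fft.fft(T, axis=2)
    return [int(np.linalg.matrix_rank(T_hat[:, :, k], tol=tol)) for k in range(T.shape[2])]

# Example: stack per-view self-representation matrices (random low-rank
# placeholders here) into a third-order tensor, one view per frontal slice.
rng = np.random.default_rng(1)
n, n_views = 20, 3
views = [rng.standard_normal((n, 2)) @ rng.standard_normal((2, n)) for _ in range(n_views)]
T = np.stack(views, axis=2)          # shape (n, n, n_views)
print(tensor_multi_rank(T))
```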


Scalable Nuclear-norm Minimization by Subspace Pursuit Proximal Riemannian Gradient

Trace-norm regularization plays a vital role in many learning tasks, such as low-rank matrix recovery (MR) and low-rank representation (LRR). Solving this problem directly can be computationally expensive due to the unknown rank of the variables or large-rank singular value decompositions (SVDs). To address this, we propose a proximal Riemannian gradient (PRG) scheme which can efficiently solve tr...
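To make the cost issue concrete, the sketch below shows the standard singular value thresholding step, i.e. the proximal operator of the trace norm, which plain proximal methods apply at every iteration and whose full SVD scalable schemes try to avoid. It is the generic textbook operator, not that paper's PRG algorithm.

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: proximal operator of tau * ||.||_*.

    Shrinks every singular value of Y by tau and truncates at zero. The full
    SVD required here is the per-iteration bottleneck of naive proximal
    methods for trace-norm problems.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return (U * s_shrunk) @ Vt

rng = np.random.default_rng(2)
Y = rng.standard_normal((30, 20))
X = svt(Y, tau=2.0)
print("rank before:", np.linalg.matrix_rank(Y), "rank after:", np.linalg.matrix_rank(X))
```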


Rank/Norm Regularization with Closed-Form Solutions: Application to Subspace Clustering

When data is sampled from an unknown subspace, principal component analysis (PCA) provides an effective way to estimate the subspace and hence reduce the dimension of the data. At the heart of PCA is the Eckart-Young-Mirsky theorem, which characterizes the best rank-k approximation of a matrix. In this paper, we prove a generalization of the Eckart-Young-Mirsky theorem under all unitarily invari...
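The classical Eckart-Young-Mirsky statement is easy to check computationally: truncating the SVD to the top k singular triplets gives the best rank-k approximation, with Frobenius error equal to the energy in the discarded singular values. A minimal NumPy check of that fact (illustrative only, not the generalization proved in that paper):

```python
import numpy as np

def best_rank_k(A, k):
    """Best rank-k approximation of A (Eckart-Young-Mirsky): keep the top-k
    singular triplets of the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 30))
k = 5
Ak = best_rank_k(A, k)
s = np.linalg.svd(A, compute_uv=False)
# Optimal Frobenius error equals the norm of the discarded singular values.
print(np.allclose(np.linalg.norm(A - Ak, "fro"), np.sqrt(np.sum(s[k:] ** 2))))
```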



Journal title:

Volume: 2015   Issue:

Pages: -

Publication date: 2015